31 research outputs found

    Emotional Cues during Simultaneous Face and Voice Processing: Electrophysiological Insights

    Both facial expression and tone of voice are key signals in emotional communication, but their brain processing correlates remain unclear. Accordingly, we constructed a novel implicit emotion recognition task in which human faces and voices with neutral, happy, or angry valence were presented simultaneously, embedded within a monkey face and voice recognition task. To investigate the temporal unfolding of affective processing of human face-voice pairings, we recorded event-related potentials (ERPs) to these audiovisual stimuli in 18 healthy subjects. N100, P200, N250, and P300 components were observed at frontal-central electrodes, while P100, N170, and P270 components were observed at parietal-occipital electrodes. Results indicated a significant audiovisual stimulus effect on the amplitudes and latencies of components in the frontal-central region (P200, P300, and N250) but not the parietal-occipital region (P100, N170, and P270). Specifically, P200 and P300 amplitudes were more positive for emotional than for neutral audiovisual stimuli, irrespective of valence, whereas N250 amplitude was more negative for neutral than for emotional stimuli. No differentiation was observed between the angry and happy conditions. These results suggest that a general effect of emotion on audiovisual processing can emerge as early as 200 ms (the P200 peak latency) after stimulus onset, despite the implicit affective processing demands of the task, and that this effect is distributed mainly over the frontal-central region.
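
    To illustrate the kind of component measurement the abstract describes, here is a minimal sketch (not the authors' actual pipeline) of extracting P200 peak amplitude and latency from a subject-averaged ERP trace. The sampling rate, baseline duration, search window, and all variable names are illustrative assumptions.

    # Minimal sketch (not the authors' pipeline): extract P200 peak
    # amplitude and latency from an averaged ERP at a frontal-central
    # electrode. The 250 Hz rate, 200 ms baseline, and 150-250 ms search
    # window are assumptions for illustration only.
    import numpy as np

    FS = 250             # sampling rate in Hz (assumed)
    BASELINE_S = 0.2     # 200 ms pre-stimulus baseline (assumed)

    def p200_peak(erp, fs=FS, baseline_s=BASELINE_S, window=(0.15, 0.25)):
        """Return (amplitude, latency_ms) of the most positive point
        within the P200 search window of a 1-D ERP trace."""
        start = int((baseline_s + window[0]) * fs)
        stop = int((baseline_s + window[1]) * fs)
        segment = erp[start:stop]
        idx = np.argmax(segment)                 # P200 is a positive peak
        latency_ms = (window[0] + idx / fs) * 1000.0
        return segment[idx], latency_ms

    # Example: compare emotional vs. neutral conditions at one electrode
    # (random data stands in for real subject-averaged ERPs).
    rng = np.random.default_rng(0)
    n_samples = int((BASELINE_S + 0.8) * FS)     # -200..800 ms epoch
    for label in ("neutral", "emotional"):
        amp, lat = p200_peak(rng.normal(0, 1, n_samples))
        print(f"{label}: P200 amplitude {amp:.2f} at {lat:.0f} ms")

    The same window-and-peak logic would apply to the other components, with each component's polarity and latency range substituted for the P200 values assumed here.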

    Seeing Emotion with Your Ears: Emotional Prosody Implicitly Guides Visual Attention to Faces

    Interpersonal communication involves the processing of multimodal emotional cues, particularly facial expressions (visual modality) and emotional speech prosody (auditory modality), which can interact during information processing. Here, we investigated whether the implicit processing of emotional prosody systematically influences gaze behavior toward facial expressions of emotion. We analyzed the eye movements of 31 participants as they scanned a visual array of four emotional faces portraying fear, anger, happiness, and neutrality, while listening to an emotionally inflected pseudo-utterance ("Someone migged the pazing") spoken in a congruent or incongruent tone. Participants heard the utterance during the first 1250 ms of a five-second visual array and then performed an immediate recall decision about the face they had just seen. The frequency and duration of first saccades and of total looks in three temporal windows (0–1250 ms, 1250–2500 ms, 2500–5000 ms) were analyzed according to the emotional content of faces and voices. Results showed that participants looked longer and more frequently at faces that matched the prosody in all three time windows (an emotion congruency effect), although this effect was often emotion-specific (strongest for fear). Effects of prosody on visual attention to faces persisted over time and could be detected long after the auditory information was no longer present. These data imply that emotional prosody is processed automatically during communication and that these cues play a critical role in how humans respond to related visual cues in the environment, such as facial expressions.
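
    As a sketch of the time-window analysis described above (not the authors' actual code), the following computes total looking time at prosody-congruent versus incongruent faces within each of the three windows. The fixation-record layout and all values are illustrative assumptions; only the window boundaries come from the abstract.

    # Minimal sketch (not the authors' analysis): total looking time per
    # window, split by prosody congruency. Fixation records are assumed
    # to be (onset_ms, offset_ms, congruent_with_prosody) tuples.
    WINDOWS = [(0, 1250), (1250, 2500), (2500, 5000)]   # ms, from abstract

    # Placeholder fixation data for one trial.
    fixations = [
        (100, 600, True), (650, 1100, False),
        (1300, 2000, True), (2600, 4200, True),
    ]

    def looking_time(fixations, window):
        """Sum fixation time overlapping `window`, split by congruency."""
        lo, hi = window
        totals = {True: 0.0, False: 0.0}
        for onset, offset, congruent in fixations:
            overlap = min(offset, hi) - max(onset, lo)  # clip to window
            if overlap > 0:
                totals[congruent] += overlap
        return totals

    for w in WINDOWS:
        t = looking_time(fixations, w)
        print(f"{w}: congruent {t[True]:.0f} ms, incongruent {t[False]:.0f} ms")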

    Male-specific deficits in natural reward learning in a mouse model of neurodevelopmental disorders

    Neurodevelopmental disorders, including autism spectrum disorders, are strongly male-biased, but the underpinnings of this bias are unknown. Striatal dysfunction has been strongly implicated in the pathophysiology of neurodevelopmental disorders, raising the question of whether there are sex differences in how the striatum is affected by the genetic risk factors linked to these disorders. Here we report male-specific deficits in striatal functions important to reward learning in a mouse model of 16p11.2 hemideletion, a genetic mutation strongly associated with risk of neurodevelopmental disorders, particularly autism and attention-deficit hyperactivity disorder. We find that male, but not female, 16p11.2 deletion animals show impairments in reward-directed learning and in maintaining motivation to work for rewards. Male, but not female, deletion animals overexpress mRNA for dopamine receptor 2 and adenosine receptor 2a in the striatum, markers of medium spiny neurons signaling via the indirect pathway, which is associated with behavioral inhibition. Both sexes show a 50% reduction in striatal mRNA levels of the genes located within the 16p11.2 region, including extracellular signal-regulated kinase 1 (ERK1). However, hemideletion males show increased striatal ERK1 activation, both at baseline and in response to sucrose, a signaling change associated with decreased striatal plasticity. This increase in ERK1 phosphorylation is coupled with a decrease in the abundance of the ERK phosphatase striatal-enriched protein tyrosine phosphatase (STEP) in hemideletion males. In contrast, females do not show ERK1 activation in response to sucrose; notably, hemideletion females show elevated protein levels of ERK1, as well as of the related kinase ERK2, above what would be predicted from mRNA levels. These data indicate profound sex differences in the impact of a genetic lesion linked to neurodevelopmental disorders, including mechanisms of male-specific vulnerability and female-specific resilience that affect intracellular signaling in the brain.
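
    To make the quantitative claims concrete, here is a minimal sketch of how the reported 50% mRNA reduction and the ERK1 activation measure could be computed from normalized signal intensities. The assays, field names, and all numbers are assumptions; the abstract does not specify the measurement method.

    # Minimal sketch (methods assumed, not stated in the abstract):
    # mRNA fold change (deletion vs. wild type) and a phospho-ERK1 /
    # total-ERK1 activation ratio from normalized intensities.
    # All numbers are placeholders.
    import numpy as np

    wt_expr  = np.array([1.00, 0.95, 1.05, 1.02])  # 16p11.2-gene mRNA, wild type
    del_expr = np.array([0.52, 0.48, 0.50, 0.49])  # hemideletion animals

    fold_change = del_expr.mean() / wt_expr.mean()
    print(f"mRNA fold change: {fold_change:.2f}")  # ~0.5, the 50% reduction

    # ERK1 activation: ratio of phosphorylated to total ERK1 signal.
    phospho_erk1 = np.array([1.8, 2.1, 1.9])
    total_erk1   = np.array([1.0, 1.1, 0.9])
    print(f"pERK1/ERK1 ratio: {(phospho_erk1 / total_erk1).mean():.2f}")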

    Emotionale Musikverarbeitung mit dem Cochlea Implantat nach Sprachprozessoranpassung [Emotional music processing with a cochlear implant after speech-processor fitting]

    No full text